Agile software development in an earned value world: a survival guide
Agile methodologies are current best practice in software development. They are favored for, among other reasons, preventing premature optimization by taking a somewhat short-term focus, and for allowing frequent replans and reprioritizations of upcoming development work based on recent results and the current backlog. At the same time, funding agencies prescribe earned value management accounting for large projects which, these days, inevitably include substantial software components. Earned value approaches emphasize a more comprehensive and typically longer-range plan, and tend to characterize frequent replans and reprioritizations as indicative of problems. Here we describe the planning, execution and reporting framework used by the LSST Data Management team, which navigates these opposing tensions.
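The earned value metrics at issue reduce to a few standard ratios. A minimal sketch of the textbook definitions (illustrative only, not the LSST Data Management team's actual reporting tooling):

```python
# Standard earned-value management (EVM) metrics, as defined in the
# usual project-management literature. Inputs are in a common unit
# (e.g. dollars or budgeted hours).

def evm_metrics(pv, ev, ac):
    """Compute schedule/cost variances and performance indices.

    pv: planned value (budgeted cost of work scheduled)
    ev: earned value (budgeted cost of work performed)
    ac: actual cost (actual cost of work performed)
    """
    return {
        "SV": ev - pv,    # schedule variance (< 0 means behind schedule)
        "CV": ev - ac,    # cost variance (< 0 means over budget)
        "SPI": ev / pv,   # schedule performance index (< 1 means behind)
        "CPI": ev / ac,   # cost performance index (< 1 means over budget)
    }

m = evm_metrics(pv=100.0, ev=90.0, ac=80.0)
print(m["SPI"], m["CPI"])  # 0.9 1.125: behind schedule but under budget
```

Under these definitions, frequent agile replanning shows up as churn in the planned-value baseline, which is why the two accounting worldviews sit in tension.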
Software Architecture and System Design of Rubin Observatory
Starting from a description of the Rubin Observatory Data Management System
Architecture, and drawing on our experience with and involvement in a range of
other projects including Gaia, SDSS, UKIRT, and JCMT, we derive a series of
generic design patterns and lessons learned. Comment: 10 pages, ADASS XXXII submission
The LOFAR Transients Pipeline
Current and future astronomical survey facilities provide a remarkably rich
opportunity for transient astronomy, combining unprecedented fields of view
with high sensitivity and the ability to access previously unexplored
wavelength regimes. This is particularly true of LOFAR, a
recently-commissioned, low-frequency radio interferometer, based in the
Netherlands and with stations across Europe. The identification of and response
to transients is one of LOFAR's key science goals. However, the large data
volumes which LOFAR produces, combined with the scientific requirement for
rapid response, make automation essential. To support this, we have developed
the LOFAR Transients Pipeline, or TraP. The TraP ingests multi-frequency image
data from LOFAR or other instruments and searches it for transients and
variables, providing automatic alerts of significant detections and populating
a lightcurve database for further analysis by astronomers. Here, we discuss the
scientific goals of the TraP and how it has been designed to meet them. We
describe its implementation, including both the algorithms adopted to maximize
performance as well as the development methodology used to ensure it is robust
and reliable, particularly in the presence of artefacts typical of radio
astronomy imaging. Finally, we report on a series of tests of the pipeline
carried out using simulated LOFAR observations with a known population of
transients. Comment: 30 pages, 11 figures; Accepted for publication in Astronomy &
Computing; Code at https://github.com/transientskp/tk
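A core task of any transients pipeline like the TraP is deciding, from a lightcurve of flux measurements, whether a source is significantly variable. A generic sketch of one common approach, the weighted reduced chi-squared about the weighted mean (illustrative only; the actual TraP implementation and its variability statistics live in the linked repository):

```python
import numpy as np

# Flag a source as variable if its lightcurve is inconsistent with a
# constant flux, using a weighted reduced chi-squared statistic.
# This is a generic sketch, not the TraP's actual code; the threshold
# of 3.0 is an arbitrary illustrative choice.

def reduced_chi2(flux, flux_err):
    """Weighted reduced chi-squared of a lightcurve about its weighted mean."""
    w = 1.0 / flux_err**2
    mean = np.sum(w * flux) / np.sum(w)
    return np.sum(w * (flux - mean) ** 2) / (len(flux) - 1)

flux = np.array([1.0, 1.1, 0.9, 5.0])   # Jy; the last point is a flare
err = np.array([0.1, 0.1, 0.1, 0.1])
is_variable = reduced_chi2(flux, err) > 3.0
print(is_variable)  # True: the flare dominates the statistic
```

In a production pipeline this test runs automatically on each source as new images are ingested, feeding the alert stream and the lightcurve database described above.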
The Hyper Suprime-Cam Software Pipeline
In this paper, we describe the optical imaging data processing pipeline
developed for the Subaru Telescope's Hyper Suprime-Cam (HSC) instrument. The
HSC Pipeline builds on the prototype pipeline being developed by the Large
Synoptic Survey Telescope's Data Management system, adding customizations for
HSC, large-scale processing capabilities, and novel algorithms that have since
been reincorporated into the LSST codebase. While designed primarily to reduce
HSC Subaru Strategic Program (SSP) data, it is also the recommended pipeline
for reducing general-observer HSC data. The HSC pipeline includes high level
processing steps that generate coadded images and science-ready catalogs as
well as low-level detrending and image characterizations. Comment: 39 pages, 21 figures, 2 tables. Submitted to Publications of the
Astronomical Society of Japan
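The coaddition step mentioned above combines many registered exposures into a deeper image. A toy inverse-variance-weighted stack conveys the basic idea (a simplified sketch only; the HSC/LSST pipelines additionally perform warping, PSF handling, and outlier rejection):

```python
import numpy as np

# Toy coaddition: inverse-variance-weighted mean of aligned exposures.
# `images` and `variances` are stacks of already-registered frames with
# shape (n_exposures, ny, nx). Not the HSC pipeline's actual algorithm.

def coadd(images, variances):
    """Per-pixel inverse-variance-weighted mean over the exposure axis."""
    w = 1.0 / variances
    return np.sum(w * images, axis=0) / np.sum(w, axis=0)

imgs = np.array([[[1.0, 2.0]],
                 [[3.0, 2.0]]])        # two tiny 1x2 "exposures"
var = np.ones_like(imgs)               # equal weights
print(coadd(imgs, var))  # [[2. 2.]]
```

Weighting by inverse variance maximizes the signal-to-noise of the combined image when the per-exposure noise estimates are accurate.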
The behaviour of dark matter associated with four bright cluster galaxies in the 10kpc core of Abell 3827
Galaxy cluster Abell 3827 hosts the stellar remnants of four almost equally bright elliptical galaxies within a core of radius 10 kpc. Such corrugation of the stellar distribution is very rare, and suggests recent formation by several simultaneous mergers. We map the distribution of associated dark matter, using new Hubble Space Telescope imaging and Very Large Telescope/Multi-Unit Spectroscopic Explorer integral field spectroscopy of a gravitationally lensed system threaded through the cluster core. We find that each of the central galaxies retains a dark matter halo, but that (at least) one of these is spatially offset from its stars. The best-constrained offset is kpc, where the 68 per cent confidence limit includes both statistical error and systematic biases in mass modelling. Such offsets are not seen in field galaxies, but are predicted during the long infall to a cluster, if dark matter self-interactions generate an extra drag force. With such a small physical separation, it is difficult to definitively rule out astrophysical effects operating exclusively in dense cluster core environments - but if interpreted solely as evidence for self-interacting dark matter, this offset implies a cross-section σ_DM/m ≃ (1.7 ± 0.7) × 10^-4 cm^2 g^-1 × (t_infall/10^9 yr)^-2, where t_infall is the infall duration.
A Physical Model for z~2 Dust Obscured Galaxies
We present a physical model for the origin of z~2 Dust-Obscured Galaxies
(DOGs), a class of high-redshift ULIRGs selected at 24 micron which are
particularly optically faint (24/R>1000). By combining N-body/SPH simulations
of high redshift galaxy evolution with 3D polychromatic dust radiative transfer
models, we find that luminous DOGs (with F24 > 0.3 mJy) at z~2 are well-modeled
as extreme gas-rich mergers in massive (~5x10^12-10^13 Msun) halos, with
elevated star formation rates (~500-1000 Msun/yr) and/or significant AGN growth
(Mdot > 0.5 Msun/yr), whereas less luminous DOGs are more diverse in nature. At
final coalescence, merger-driven DOGs transition from being starburst dominated
to AGN dominated, evolving from a "bump" to a power-law shaped mid-IR (IRAC)
spectral energy distribution (SED). After the DOG phase, the galaxy settles
back to exhibiting a "bump" SED with bluer colors and lower star formation
rates. While power-law galaxies are canonically associated with AGN dominance,
we find that the power-law mid-IR SED can arise both from direct AGN
contribution and from a heavily dust-obscured stellar bump at times when the
galaxy is starburst dominated. Thus power-law galaxies can be either
starburst or AGN dominated. Less luminous DOGs can be well-represented either
by mergers, or by massive (M_baryon ~ 5x10^11 Msun) secularly evolving
gas-rich disc galaxies (with SFR > 50 Msun/yr). By utilising similar models as
those employed in the SMG formation study of Narayanan et al. (2010), we
investigate the connection between DOGs and SMGs. We find that the most heavily
star-forming merger driven DOGs can be selected as Submillimetre Galaxies
(SMGs), while both merger-driven and secularly evolving DOGs typically satisfy
the BzK selection criteria. Comment: Accepted by MNRAS; major changes include better description of
dependency on ISM specification and updated models allowing dust to evolve
with metallicity